Sky battles: Fighting back against rogue drones

BBC News

Rogue drones have nearly caused air accidents, been used as offensive weapons, delivered drugs to prisoners, and spied on people. So how can we fight back? This summer a packed Airbus A321 came within 100ft (30m) of disaster after encountering a drone at 15,500ft. The number of near-misses of this sort has trebled over the last three years, with 92 incidents reported last year in the UK alone; dozens were classified as involving a serious chance of collision.


Should AI researchers kill people?

#artificialintelligence

AI research is increasingly being used by militaries around the world for offensive and defensive applications. This past week, groups of AI researchers began to fight back against two separate programs located halfway around the world from each other, raising tough questions about just how much engineers can shape the future uses of these technologies. In Silicon Valley, the New York Times published an internal protest memo signed by several thousand Google employees, which vociferously opposed Google's work on a Defense Department-led initiative called Project Maven, an effort to use computer vision algorithms to analyze vast troves of image and video data. As the department's news service quoted Marine Corps Col. Drew Cukor last year on the initiative: "You don't buy AI like you buy ammunition. There's a deliberate workflow process and what the department has given us with its rapid acquisition authorities is an opportunity for about 36 months to explore what is governmental and [how] best to engage industry [to] advantage the taxpayer and the warfighter, who wants the best algorithms that exist to augment and complement the work he does."